Dependency-based Convolutional Neural Networks for Sentence Embedding

Authors

  • Mingbo Ma
  • Liang Huang
  • Bowen Zhou
  • Bing Xiang
Abstract

In sentence modeling and classification, convolutional neural network approaches have recently achieved state-of-the-art results, but all such efforts process word vectors sequentially and neglect long-distance dependencies. To exploit both deep learning and linguistic structures, we propose a tree-based convolutional neural network model that exploits various long-distance relationships between words. Our model improves the sequential baselines on all three sentiment and question classification tasks, and achieves the highest published accuracy on TREC.
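The abstract leaves the exact convolution pattern unspecified, so the sketch below only illustrates the general idea of a dependency-based convolution: instead of sliding a filter over sequential n-grams, each word is grouped with its ancestors on the dependency tree, so syntactically related but distant words fall into the same filter window. The word/parent/grandparent pattern, the tanh nonlinearity, and all names here are illustrative assumptions, not necessarily the paper's configuration.

```python
import numpy as np

def dependency_conv(embeddings, heads, W, b):
    """Tree-based convolution over dependency "windows".

    embeddings : (n_words, d) word vectors for one sentence.
    heads      : heads[i] is the index of word i's dependency head, -1 for the root.
    W, b       : filter parameters, shapes (3 * d, n_filters) and (n_filters,).

    Each window is a word together with its parent and grandparent on the
    dependency tree, so long-distance but syntactically adjacent words share a filter.
    """
    n_words, d = embeddings.shape
    pad = np.zeros(d)  # stands in for a missing parent / grandparent

    def vec(i):
        return embeddings[i] if i >= 0 else pad

    windows = []
    for i in range(n_words):
        parent = heads[i]
        grandparent = heads[parent] if parent >= 0 else -1
        windows.append(np.concatenate([vec(i), vec(parent), vec(grandparent)]))

    conv = np.tanh(np.stack(windows) @ W + b)  # (n_words, n_filters)
    return conv.max(axis=0)                    # max-pool over all positions


# Toy usage: 5 words, 4-dim embeddings, 8 filters.
rng = np.random.default_rng(0)
emb = rng.normal(size=(5, 4))
heads = [1, -1, 1, 4, 1]                       # word 1 is the root
features = dependency_conv(emb, heads, rng.normal(size=(12, 8)), np.zeros(8))
print(features.shape)                          # (8,) -> fed to a downstream classifier
```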

Similar resources

Attention-Based Convolutional Neural Network for Semantic Relation Extraction

Nowadays, neural networks play an important role in the task of relation classification. In this paper, we propose a novel attention-based convolutional neural network architecture for this task. Our model makes full use of word embedding, part-of-speech tag embedding and position embedding information. A word-level attention mechanism is able to better determine which parts of the sentence are m...
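As a rough illustration of the ingredients named in this snippet (word, part-of-speech and position embeddings pooled by word-level attention), here is a minimal numpy sketch; the dot-product scoring, the shapes and the names are assumptions, not the architecture of that paper.

```python
import numpy as np

def attentive_sentence_vector(word_emb, pos_emb, dist1_emb, dist2_emb, u):
    """Word-level attention over concatenated per-word features.

    word_emb             : (n, d_w) word embeddings
    pos_emb              : (n, d_p) part-of-speech tag embeddings
    dist1_emb, dist2_emb : (n, d_d) position embeddings, i.e. embeddings of each
                           word's relative distance to the two candidate entities
    u                    : (d_w + d_p + 2 * d_d,) attention query vector

    Words that score higher against u contribute more to the pooled sentence
    vector, which a downstream convolution / classifier layer can consume.
    """
    features = np.concatenate([word_emb, pos_emb, dist1_emb, dist2_emb], axis=1)
    scores = features @ u
    weights = np.exp(scores - scores.max())
    weights /= weights.sum()            # softmax over the words of the sentence
    return weights @ features           # (d_w + d_p + 2 * d_d,)


# Toy usage: a 6-word sentence.
rng = np.random.default_rng(1)
n, d_w, d_p, d_d = 6, 50, 5, 5
sent_vec = attentive_sentence_vector(
    rng.normal(size=(n, d_w)), rng.normal(size=(n, d_p)),
    rng.normal(size=(n, d_d)), rng.normal(size=(n, d_d)),
    rng.normal(size=(d_w + d_p + 2 * d_d,)),
)
print(sent_vec.shape)  # (65,)
```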

Graph Convolutional Networks with Argument-Aware Pooling for Event Detection

The current neural network models for event detection have only considered the sequential representation of sentences. Syntactic representations have not been explored in this area although they provide an effective mechanism to directly link words to their informative context for event detection in the sentences. In this work, we investigate a convolutional neural network based on dependency t...

Tree-based Convolution: A New Neural Architecture for Sentence Modeling

This paper proposes a new convolutional neural architecture based on tree structures, called the tree-based convolutional neural network (TBCNN). Two variants take advantage of constituency trees and dependency trees, respectively, to model sentences. Compared with traditional “flat” convolutional neural networks (CNNs), TBCNNs explicitly explore sentences’ structural information; compared with ...

Dependency Parsing with Dilated Iterated Graph CNNs

Dependency parses are an effective way to inject linguistic knowledge into many downstream tasks, and many practitioners wish to efficiently parse sentences at scale. Recent advances in GPU hardware have enabled neural networks to achieve significant gains over the previous best models, but these models still fail to leverage GPUs’ capability for massive parallelism due to their requirement of sequ...
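The snippet is cut off before the model itself, but the parallelism argument it sets up is usually addressed with dilated convolutions: every output position depends only on a fixed, strided set of input positions, so all positions can be computed at once, and stacking layers with growing dilation covers long-range context. Below is a minimal numpy sketch of one such layer; the width-3 filter, zero padding and ReLU are assumptions for illustration.

```python
import numpy as np

def dilated_conv1d(x, W, b, dilation):
    """One width-3 dilated 1-D convolution over token representations.

    x : (n_tokens, d_in)   per-token input vectors
    W : (3, d_in, d_out)   filter weights for offsets (-dilation, 0, +dilation)
    b : (d_out,)           bias

    Position i only reads tokens i - dilation, i, i + dilation, so every output
    position is computed independently (hence in parallel on a GPU), and stacking
    layers with dilations 1, 2, 4, ... grows the receptive field exponentially
    without any sequential dependence between positions.
    """
    n, d_in = x.shape
    padded = np.vstack([np.zeros((dilation, d_in)), x, np.zeros((dilation, d_in))])
    out = np.empty((n, W.shape[2]))
    for i in range(n):
        window = padded[[i, i + dilation, i + 2 * dilation]]   # x[i-dil], x[i], x[i+dil]
        out[i] = np.einsum('kd,kdo->o', window, W) + b
    return np.maximum(out, 0.0)  # ReLU


# Toy usage: stack three layers with growing dilation over a 10-token "sentence".
rng = np.random.default_rng(2)
h = rng.normal(size=(10, 16))
for dil in (1, 2, 4):
    h = dilated_conv1d(h, rng.normal(size=(3, 16, 16)) * 0.1, np.zeros(16), dil)
print(h.shape)  # (10, 16)
```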

Journal title:

Volume   Issue

Pages  -

Publication date: 2015